They basically stopped short of calling the scientific method a cultural construct, at which point I'm sure I would have snapped.
I can't resist...
Did the scientific method grow on a tree, or did people invent it?
Did people invent the scientific method simultaneously everywhere, or was it invented and practiced in specific places?
:D
The real fallacy in my opinion is having a connotation that if something is constructed and promoted within a culture, that makes it wrong. For example, consider the Pythagorean theorem... knowing that Pythagoras was a rich white cis male, shouldn't we remove it from the curriculum? And perhaps replace it with something more enlightened, such as: "all sides of a triangle are equal, even if their lengths may be different".
In the same sense, science, and even rationality itself, are cultural constructs. Maybe even human speech is a cultural construct, but luckily that happened sufficiently long ago that now all cultures have it. Okay, I am not sure about the last example. But I am sure that calling things "cultural constructs" is a cultural construct itself.
The scientific method is a cultural construct, but one that yields nice things such as iPhones and reasonably accurate theories of physics. Of course, it also helps produce nasty things like atomic bombs.
I think the real fallacy is saying that the scientific method is just as good as any other method at finding truth.
A few notes:
Moral relativism, and metaethics in general, are unrelated to the scientific method; I hope you can figure out why, and maybe discuss it next time.
You appear to draw a sharp division between yourself (the enLWightened) and "them" (the unwashed). Given "the need to detect [the biases] in ourselves", how much effort and time have you put into examining your own experiences?
Given the apparent failure of this last class, can you identify your personal bias or a fallacy which resulted in you being blindsided by this failure?
Consider starting small, with short, clear, and engaging examples, like Newcomb's problem, the Prisoner's Dilemma, or the Trolley problem, or the Milgram or Stanford prison experiments.
A common problem among novice instructors is cramming far more material into one class than the students can conceivably absorb. This is because we tend to underestimate how hard something is to learn once we have internalized it. After all, it looks so clear now! Consider reducing the amount of material you plan to present and going over more examples instead.
If you know your audience well, consider modeling their reactions to what you say, given their level of understanding, interest and skepticism, then plan for contingencies, like how to get a sidelined discussion back on track without being heavy-handed.
Good luck!
In retrospect this was almost inevitable. Bias means one thing in modern society.
Taboo "bias" and try again?
Perhaps they could be called "errors", errors we have systematic tendencies to make, and when describing them, explain every time why they are errors, why they fail to cut through to the truth. Then people may not find it so easy to interpret "biases" as being like taste in music or clothes.
Disclaimer: I have no experience of trying to teach this stuff to anyone.
I had sort of forgotten that "bias" could be taste in music or differential human outcomes based on "biased" treatment. Noticing that collision was helpful to me.
Also, I think there is an interesting quirk in the LW/local usage of the term "bias" and its general stance towards epistemology. The local culture is really really into "overcoming biases" with a zeal and cultural functionality that has echoes in the Christian doctrine of Original Sin.
(Not that this is bad! Assuming that people are in cognitive error by default because of biases is useful for getting people to actually listen with some measure of generosity to inferentially distant third parties and teachers and so on. Also, the "biases" framing powers a pretty good sales pitch for learning about meta-cognition because loss aversion is a known bias that people who need meta-cognitive training probably have. Given knowledge of loss aversion, you should naively expect people who need a rationality upgrade to be about three times more interested in avoiding cognitive downsides as compared to their enthusiasm for cognitive upgrades. The very name of the website "le...
They basically stopped short of calling the scientific method a cultural construct
I had this problem recently too, and my solution was not to mention "science" in and of itself, but to talk about heuristics based on probability. It's much harder to argue that math is a social construct. If you can explain how biases fail using probability theory, it might go over a lot better.
I think speaking in terms of probabilities also clears up a lot of epistemological confusion. "Magical" thinkers tend to believe that a lack of absolute certainty is more or less equivalent to total uncertainty (I know I did). At the same time, they'll understand that a 50% chance is not a 99% chance even though neither of them is 100% certain. It might also be helpful to point out all the things they are intuitively very certain of (that the sun will rise, that the floor will not cave in, that the carrot they put in their mouth will taste like carrots always do) but don't have absolute certainty of. I think it's important to make clear that you agree with them that we don't have absolute certainty of anything and instead shift the focus toward whether absolute certainty is really necessary in order to make decisions or claim that we "know" things.
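To make "explaining how biases fail using probability theory" concrete, here is a minimal sketch of the conjunction fallacy (the classic Linda problem): for any events A and B, P(A and B) can never exceed P(A), even though intuition often rates the conjunction as more likely. The specific probabilities below are invented purely for illustration.

```python
# Conjunction fallacy sketch: the conjunction of two events can never be
# more probable than either event alone, whatever the numbers are.
# These probabilities are made up for illustration only.

p_teller = 0.05                  # P(Linda is a bank teller), assumed
p_feminist_given_teller = 0.80   # P(feminist | bank teller), assumed

# P(teller AND feminist) = P(teller) * P(feminist | teller)
p_teller_and_feminist = p_teller * p_feminist_given_teller

# This inequality holds no matter what values you plug in above.
assert p_teller_and_feminist <= p_teller
print(round(p_teller, 2), round(p_teller_and_feminist, 2))
```

The point for a class discussion is that the inequality is a theorem, not an opinion: whoever judges "feminist bank teller" as more likely than "bank teller" is making an error you can demonstrate, not merely a culturally different choice.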
I'm generally skeptical of lecturing as a method for teaching anything. Find or invent a game where victory hinges on understanding some basic principle you want to teach, and have the club play it.
Perhaps you did not pick the right biases? Remember how Rational!Harry convinced Draco Malfoy? You need to start small. First produce the belief that your methods lead to more correct results, then introduce more and more biases, letting them build up a more accurate picture of the world. Then, once you have critical mass, go for a bias that is more central to their worldview. Your ideas need to be strong enough to win the inevitable war of contradictions.
Consider that LessWrong is a self selected community. People come here, read the site, and only...
I think your post is quite ironic. You start by saying that you explicitly tried to teach them to first detect biases in themselves and then in other people. Then you describe how they got it all wrong, without any investigation of whether your own beliefs might need updating.
You confuse the quest for reductionism with the quest for bias-free thinking. Those are two different projects. Nobody gives you a good Anki deck for rationality because there's nobody around who has reduced rationality to atomic concepts that you could stuff into an Anki deck. Most people...
I don't think the problem is magical thinkers, it's probably (as Luke said) that bias has more than one meaning.
It might be worth exploring how you tell when a behavior is a matter of harmless variations in custom and when some behaviors are better than others (identifying when a project is no longer worth pursuing rather than just assuming it should be completed).
the discussion changed from an explanation of the attribution bias into a series of multicultural examples in favor of moral relativity.
You should characterize such discussions as "advanced", and briefly comment on the major emotional, social, and status biases that go into such questions. When they have some understanding of their cognitive biases around questions of facts that they have no emotional investment in, then you can start talking about social and value laden biases, and maybe try some discussions where they are operative.
It's your party. They're the guests. When people are talking off topic, politely inform them of the agenda and move on.
I would just make the point that there's nothing wrong with the idea that different people's cultures, traditions, and lifestyles are equally valid; cultural relativity does make sense in its own context (there's nothing inherently better about living in a small middle-class suburb with 1.5 kids, a dog, and two cars). But objective reality is not relative; when you're talking about the universe itself, it is something that simply is. There was one great quote, "reality is that which continues to exist even i...
I'm afraid I haven't properly designed the Muggle Studies course I introduced at my local Harry Potter fan club. Last Sunday we finally had our second class (after months of insistence and delays), and I introduced some very basic descriptions of common biases, while of course emphasizing the need to detect them in ourselves before trying to detect them in other people. At some point, which I didn't completely notice, the discussion shifted from an explanation of the attribution bias into a series of multicultural examples in favor of moral relativity. I honestly don't know how that happened, but as more and more attendees voiced their comments, I started to fear someone would irreversibly damage the lessons I was trying to teach. They basically stopped short of calling the scientific method a cultural construct, at which point I'm sure I would have snapped. I don't know what to make of this. Part of me wants to keep trying and put more effort into showing these people the need for more reductionism in their worldview, but another part just wants to give them up as hopeless postmodernists. What should I do?